# BERT Distillation

## Bert Base Uncased Sst2 Distilled

**Author:** doyoungkim · **License:** Apache-2.0 · **Tags:** Text Classification, Transformers · **Downloads:** 106 · **Likes:** 0

A fine-tuned version of bert-base-uncased used primarily for text classification. The source dataset is not stated on the card, although the model name suggests SST-2 sentiment classification.
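As a minimal sketch of how a checkpoint like this is typically used, assuming the Hugging Face model ID is `doyoungkim/bert-base-uncased-sst2-distilled` (inferred from the author and title above, not confirmed by this listing):

```python
# Minimal sketch: loading a fine-tuned BERT classifier via the
# transformers pipeline API.
# NOTE: the model ID is inferred from the listing (author + title)
# and is an assumption, not confirmed by this page.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="doyoungkim/bert-base-uncased-sst2-distilled",
)

print(classifier("A surprisingly warm and funny film."))
# e.g. [{'label': 'positive', 'score': 0.99}] -- labels depend on the checkpoint
```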
## Bert Base Uncased Squad1.1 Block Sparse 0.20 V1

**Author:** madlag · **License:** MIT · **Tags:** Question Answering System, Transformers, English · **Downloads:** 15 · **Likes:** 0

A pruned and optimized BERT question-answering model that retains 38.1% of the original model's weights. It was fine-tuned on the SQuAD1.1 dataset and supports English Q&A tasks.
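A sketch of running the pruned checkpoint and checking how many weights survived pruning, assuming the Hub ID is `madlag/bert-base-uncased-squad1.1-block-sparse-0.20-v1` (inferred from the author and title, not confirmed here):

```python
# Sketch: question answering with the block-sparse checkpoint, plus a
# quick count of nonzero weights to illustrate the pruning claim.
# The model ID below is an assumption inferred from the listing.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="madlag/bert-base-uncased-squad1.1-block-sparse-0.20-v1",
)

result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="The model was fine-tuned on the SQuAD1.1 dataset.",
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': 'SQuAD1.1'}

# Count nonzero parameters (embeddings included, so the fraction will
# not match the 38.1% figure exactly).
model = qa.model
total = sum(p.numel() for p in model.parameters())
nonzero = sum((p != 0).sum().item() for p in model.parameters())
print(f"nonzero weights: {nonzero / total:.1%}")
```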
## Distilbert Base Uncased Finetuned Squad

**Author:** gokulkarthik · **License:** Apache-2.0 · **Tags:** Question Answering System, Transformers · **Downloads:** 20 · **Likes:** 0

A fine-tuned version of distilbert-base-uncased trained on the SQuAD dataset, suitable for question-answering tasks.
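For variety, here is a sketch of extracting an answer span without the pipeline helper, assuming the Hub ID is `gokulkarthik/distilbert-base-uncased-finetuned-squad` (inferred from author and title, an assumption):

```python
# Sketch: answer-span extraction with the raw model instead of the
# pipeline helper. Model ID inferred from the listing; treat it as an
# assumption.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "gokulkarthik/distilbert-base-uncased-finetuned-squad"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What is DistilBERT distilled from?"
context = "DistilBERT is a smaller model distilled from bert-base-uncased."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The model predicts start/end logits over the input tokens; take the
# argmax of each and decode the span between them.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```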
## Dynamic Tinybert

**Author:** Intel · **License:** Apache-2.0 · **Tags:** Question Answering System, Transformers, English · **Downloads:** 2,184 · **Likes:** 78

Dynamic-TinyBERT is an efficient question-answering model that improves inference efficiency through dynamic sequence-length reduction, achieving up to a 3.3x speedup while maintaining high accuracy.
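A sketch of loading it as a standard QA checkpoint, assuming the Hub ID is `Intel/dynamic_tinybert` (inferred from the listing). Note that the generic pipeline runs it like any other QA model; whether the dynamic sequence-length reduction is active depends on the inference setup.

```python
# Sketch: Dynamic-TinyBERT used as an ordinary QA checkpoint.
# The model ID "Intel/dynamic_tinybert" is assumed from the listing.
from transformers import pipeline

qa = pipeline("question-answering", model="Intel/dynamic_tinybert")

print(qa(
    question="What technique does Dynamic-TinyBERT use?",
    context=(
        "Dynamic-TinyBERT improves inference efficiency through dynamic "
        "sequence-length reduction, achieving up to a 3.3x speedup."
    ),
))
```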